Learning efficient task-dependent representations with synaptic plasticity

Neural Information Processing Systems

Neural populations encode the sensory world imperfectly: their capacity is limited by the number of neurons, availability of metabolic and other biophysical resources, and intrinsic noise. The brain is presumably shaped by these limitations, improving efficiency by discarding some aspects of incoming sensory streams, while preferentially preserving commonly occurring, behaviorally-relevant information. Here we construct a stochastic recurrent neural circuit model that can learn efficient, task-specific sensory codes using a novel form of reward-modulated Hebbian synaptic plasticity. We illustrate the flexibility of the model by training an initially unstructured neural network to solve two different tasks: stimulus estimation, and stimulus discrimination. The network achieves high performance in both tasks by appropriately allocating resources and using its recurrent circuitry to best compensate for different levels of noise. We also show how the interaction between stimulus priors and task structure dictates the emergent network representations.




Review for NeurIPS paper: Learning efficient task-dependent representations with synaptic plasticity

Neural Information Processing Systems

This paper proposes a stochastic recurrent neural network that builds up its local information representation through a learning rule based on Boltzmann machines, but weighted by a task-dependent objective function, forming a so-called tri-factor learning rule. The results show how the learned representation depends on the task (regression versus classification) in terms of the distribution of tuning curves, population-averaged activity, and dependence on stimulus priors. The paper then considers how noise is redistributed in the neural manifold so that task performance is preserved. Reviewers were overall positively predisposed towards this submission. Strengths include the coherent derivation of the proposed learning rule and the thorough analysis of its properties.
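To make the tri-factor idea concrete, the sketch below shows a generic reward-modulated (three-factor) Hebbian update on stochastic binary units: presynaptic activity, postsynaptic activity, and a scalar reward relative to a running baseline jointly gate each weight change. This is a minimal illustration of the general rule family, not the paper's actual derivation; the network size, learning rate, and task here are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 4, 3

# Hypothetical parameters -- not taken from the paper
eta = 0.05            # learning rate
baseline_rate = 0.1   # running-average rate for the reward baseline

W = rng.normal(scale=0.1, size=(n_out, n_in))
baseline = 0.0
x = np.array([1.0, 0.0, 1.0, 0.0])    # fixed example stimulus (factor 1: pre)
target = np.array([1.0, 0.0, 1.0])    # desired population response

rewards = []
for _ in range(500):
    # Stochastic binary activity: sigmoid drive plus intrinsic noise
    p = 1.0 / (1.0 + np.exp(-(W @ x)))
    y = (rng.random(n_out) < p).astype(float)   # factor 2: post
    # Factor 3: scalar reward, measured against a running baseline
    reward = -np.sum((y - target) ** 2)
    modulator = reward - baseline
    baseline += baseline_rate * (reward - baseline)
    # Tri-factor update: (post - expected post) x pre, gated by the modulator
    W += eta * modulator * np.outer(y - p, x)
    rewards.append(reward)
```

The `(y - p)` post term makes this a REINFORCE-style gradient estimate for Bernoulli units, and the reward baseline reduces the variance of the update; both are standard choices in this rule family rather than details specific to this paper.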



Keywords: learning efficient task-dependent representations, noise, synaptic plasticity